JOVIAL: Stand up Schwartz

Information on the web about JOVIAL is rather scarce. I came across some grainy old MPEG footage of Jules Schwartz giving an amusing speech, for those interested in the early pioneers of PLs.

Guy Steele on Programming Languages

Things always seem to slow down as we approach this time of year, so I'll post an OOPSLA 2007 video interview with Guy Steele on programming languages that I stumbled upon. It covers a range of topics relevant to the current and future state of PLs. Nothing too technically deep, but there are tidbits of interest scattered throughout the interview (especially on DSLs).

(see also prior LtU discussion Guy Steele on Language Design).

(Maybe others have seen this before, but I really like the interface that is provided for the video. The list of questions is a set of clickable links that jump to the place in the interview where each question was asked. I hope this catches on for all technical video interviews.)

The Little Books in Oz

Translating code from one programming language to another is a black art. Even if successful in capturing functionality, each PL has its own styles, idioms and community mores. Automated translation (which I have done) produces more misses than hits. Doing it manually gets you closer, but it can require an inordinate amount of time to get it just right. Even so, PL translation is something that I personally enjoy, as it is particularly instructive in teaching the strengths and limitations of expressing different concepts (though I usually catch flak for violating the social values of the target language).

My latest postings into this gray area are translations of the remaining Little Books to Oz, consisting of The Little Schemer, The Seasoned Schemer, The Little MLer and A Little Java, A Few Patterns (previous LtU post on The Reasoned Schemer in Oz). The Little Books are the antithesis of recipe books. There's not much code here that can be plugged into a project; the aim is to systematically teach programming thought processes. The books are useful for those wanting to learn Scheme (or ML), but the lessons are valuable even if those are not your particular language(s) of choice. Such didactic material may not be everyone's cup of tea, but it does represent a unique way to teach (and I'm still hoping for The Little Haskeller).

Along a similar line, I've started in on Introduction to Algorithms in Oz. Previously, I made a weak attempt at Knuth, which I'll get back to one of these years, but found that translating MIX to higher-level languages was tedious and time consuming. The CLRS book is a bit easier to translate, but the pseudocode the authors chose to express algorithms in doesn't map exactly to any known programming language. The language is concise, which was their aim, but it takes some shortcuts and has some peculiarities. Also, like Knuth, the algorithms are very much oriented toward mutable state. (Purely Functional Data Structures is in my queue.) Anyhow, I find it interesting that the authors of the two best-known books on algorithms chose to invent their own language rather than use an existing PL.

Happy Birthday, dear Lambda!

Eight years ago LtU was born. In terms of internet phenomena that is truly remarkable longevity (heck, even Google is hardly ten years old), but even in "real world" terms I think this qualifies LtU as a venerable institution.

Ever since Chris posted the first LtU birthday message back in 2001, it has become somewhat of a tradition to post a birthday message each year, and these have become a good place to reflect on the state of LtU and the direction it is taking. Looking at the previous birthday posts sure got me reflecting. Here they are: year one, year two, year three, year five, year six, and year seven.

So how did we fare this last year? I think that overall we did pretty well, much better than I expected last year when we were in the middle of what looked like a losing battle with spammers. I think we (and by that I mean Anton, first and foremost) managed to pretty much keep spam under control without imposing unnecessary restrictions on new users. LtU was always welcoming to new users, and we strive hard to keep it that way - which leads to the second issue...

We were worried that with too many new members signing up, the quality of discussions would go down and the atmosphere of the site would change. While this happened to some extent from time to time, I think that in general most discussions remain as informative as ever; in fact, some have become too highbrow even for me... Quite a few new members have become regulars, and even contributing editors. I am very glad to see this happen, for all the obvious reasons. I am especially glad to see that we stopped attracting so many "drive-by members" who sign up only to ask one question and are then gone. LtU is not the best place for such questions, which in the past also proved to be mostly off-topic. New members, on the other hand, always add something new to the community.

Indeed, what is truly phenomenal and inspiring about LtU for me is not the continuity of the site, but the continuity of the community. I've been saying this every year, I think, but it is worth repeating. What gives LtU its unique flavor are the many members who have been part of the community for many years, some from the very early days. While not an online sewing circle, I think LtU does encourage long-time members to become contributing editors, to share their interests more explicitly, and even to mention from time to time, if they so wish, their own changing circumstances. This is a professional community, but a community nonetheless.

The community is what holds LtU together, but it is hidden in the forum. The public face of LtU, and what should be the main focus of discussion on the site, is the home page news items. From the early days there has been a tension between the forum and the blog aspects of LtU: while ideally all good links should be on the home page, and these should be the focus of discussion, quite often the forum takes on a life of its own. Since some members do not follow the forum discussions closely, and since the home page items are a good way to steer the direction of the site, this may be less than ideal. The solution, as always, is for the contributing editors (who are the members who manage the home page group blog) to be more active, and for more members to become contributing editors. If you are a regular, and think that you can contribute semi-regularly (i.e., as often as you want) to one or more of the LtU departments, you should consider signing up.


Let me conclude this rambling message on a more personal note. This year I moved to Menlo Park, California. Hearing from and meeting local LtU members made the move easier. It was a great experience to encounter people in various places who recognized my name, and asked "Are you the guy from LtU?" (the next question usually being "So what do you say about Scala?", by the way). Being extremely busy I didn't take advantage of all the activities around here this year, but the one time I did manage to go to a BayFP meeting was great fun.

Who knows what the next year will bring? For now, thank you all for your participation. You are LtU!

Practical Set Theory

Steven Kieffer, Jeremy Avigad, Harvey Friedman (2008). A language for mathematical knowledge management. Carnegie-Mellon tech. report CMU-PHIL-181.

The authors present a compact language for articulating mathematics, PST, which is syntactic sugar for an extension of Zermelo-Fraenkel set theory, DZFC, previously proposed by Harvey Friedman, which allows the naming of partially defined functions. The article gives some examples showing how PST allows rather straightforward expression of some mathematics. It also gives some statistics gathered from an encoding of Suppes' Set Theory and Munkres' Topology, showing how PST allows a drastic compression compared to plain first-order ZFC.

A System to Understand Incorrect Programs

An ancient paper (July 1978: 30 years ago) from the long-gone Lisp Bulletin by Harald Wertz.

The paper describes a system that attempts to improve incompletely specified Lisp programs without resorting to additional information in the form of specifications, test cases or the like.

A second paper on the system is Stereotyped Program Debugging: an Aid for Novice Programmers.

Partial vectorisation of Haskell programs

Partial vectorisation of Haskell programs. Manuel M. T. Chakravarty, Roman Leshchinskiy, Simon Peyton Jones, and Gabriele Keller, Proc ACM Workshop on Declarative Aspects of Multicore Programming, San Francisco, Jan 2008.

Vectorisation for functional programs, also called the flattening transformation, relies on drastically reordering computations and restructuring the representation of data types. As a result, it only applies to the purely functional core of a fully-fledged functional language, such as Haskell or ML. A concrete implementation needs to apply vectorisation selectively and integrate vectorised with unvectorised code. This is challenging, as vectorisation alters the data representation, which must be suitably converted between vectorised and unvectorised code. In this paper, we present an approach to partial vectorisation that selectively vectorises sub-expressions and data types, and also, enables linking vectorised with unvectorised modules.

The idea is fairly simple, and utilizes conversion between vectorized and unvectorized representations of the datatypes. A formal translation scheme is provided.
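To get a feel for why such conversions are needed at all, here is a toy sketch (my own illustration, not the paper's translation scheme, and with names I made up): flattening changes an array-of-pairs into a pair-of-arrays, so unvectorised callers must convert at the boundary of vectorised code.

```ocaml
(* Unvectorised representation: one array of structured values. *)
type points = (float * float) array

(* Vectorised ("flattened") representation: one flat array per component. *)
type points_v = { xs : float array; ys : float array }

(* Conversions between the two representations. *)
let vectorise (ps : points) : points_v =
  { xs = Array.map fst ps; ys = Array.map snd ps }

let unvectorise (pv : points_v) : points =
  Array.init (Array.length pv.xs) (fun i -> (pv.xs.(i), pv.ys.(i)))

(* Vectorised code works componentwise on the flat arrays... *)
let scale_v k pv =
  { xs = Array.map (fun x -> k *. x) pv.xs;
    ys = Array.map (fun y -> k *. y) pv.ys }

(* ...and unvectorised callers go through the conversions. *)
let scale k (ps : points) : points =
  unvectorise (scale_v k (vectorise ps))
```

The real transformation is considerably more involved (it must handle nested and recursive data, functions, and linking across modules), but the representation change and boundary conversions are the essential ingredients.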

Data Parallel Haskell papers are here.

The Development of Intuitionistic Logic

Mark van Atten (2008). The Development of Intuitionistic Logic. Stanford Encyclopedia of Philosophy.

This article gives an excellent account of the development of intuitionistic logic, from its roots in Brouwer's theological metaphysics, through to its formal presentation by Heyting in 1956. The account is strong on the tensions between the subjectivist motif and the urge to formalise. Via Richard Zach.

Ada, the Ultimate Lambda?

Chris Okasaki has a blog post about teaching functional programming using Ada. He says:

Why, with excellent functional programming languages like Haskell or ML easily available, would I choose to do a functional programming project in Ada? (I CAN STILL HEAR YOU LAUGHING!) Partly because we use Ada in some courses at our institution, and I wanted a library that could help in teaching recursion to novices. But, honestly, mostly I just wanted to see if it could be done

The idea of crossing paradigms has been around for a while. SICP rather famously introduces object orientation by building it on top of Scheme.
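The SICP trick is to represent an object as a closure over mutable state that dispatches on messages. A minimal sketch of the same idea in OCaml (my transcription, echoing SICP's bank-account example; the names are mine):

```ocaml
(* Messages an account object understands. *)
type message = Deposit of int | Withdraw of int | Balance

(* An "object" is just a closure capturing its own private state.
   Every call returns the balance after processing the message. *)
let make_account initial =
  let balance = ref initial in    (* private instance state *)
  fun msg ->
    match msg with
    | Deposit n -> balance := !balance + n; !balance
    | Withdraw n when n <= !balance -> balance := !balance - n; !balance
    | Withdraw _ -> !balance      (* insufficient funds: balance unchanged *)
    | Balance -> !balance

let () =
  let acct = make_account 100 in
  ignore (acct (Deposit 50));
  ignore (acct (Withdraw 30));
  Printf.printf "balance = %d\n" (acct Balance)
```

Each call to `make_account` yields an independent instance, which is exactly the encapsulation story SICP builds up in Scheme with `set!` and `lambda`.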

What do you think about teaching or learning paradigm A in a language strongly oriented towards paradigm B? What's gained? What's lost?

Catch me if you can: Towards type-safe, hierarchical, lightweight, polymorphic and efficient error management in OCaml

Catch me if you can: Towards type-safe, hierarchical, lightweight, polymorphic and efficient error management in OCaml, by David Teller, Arnaud Spiwack, Till Varoquaux:

This is the year 2008 and ML-style exceptions are everywhere. Most modern languages, whether academic or industrial, feature some variant of this mechanism. Languages such as Java even have a degree of out-of-the-box static coverage-checking for such exceptions, which is currently not available for ML languages, at least not without resorting to external tools.

In this document, we demonstrate a design principle and a tiny library for managing errors in a functional manner, with static coverage-checking, automatically-inferred, structurally typed and hierarchical exceptional cases, all of this for what we believe is a reasonable run-time cost. Our work is based on OCaml and features simple uses of higher-order programming, low-level exceptions, phantom types, polymorphic variants and compile-time code rewriting.

Exhaustively checked, user-friendly exception handling was a bit of an open problem for a while. As the paper details, languages offered either cumbersome, exhaustively checked polymorphic exceptions, as in Haskell; unchecked, easily extensible monomorphic exceptions, as in ML; or checked, extensible exceptions using a universal type, as in Java.

Supporting exhaustively checked, easily extensible polymorphic exceptions seemed quite a challenge, which this paper solves using monadic error handling and nested polymorphic variants. The paper also gives a good overview of current techniques for exception checking in OCaml, such as ocamlexc.
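To see why polymorphic variants buy you both inference and exhaustiveness, here is a minimal sketch of the core idea (my own toy version, not the paper's library, and without its hierarchical cases or phantom types): errors are returned in a `result` monad, the variant row for each function's errors is inferred, and the compiler checks that a handler covers every inferred case.

```ocaml
(* Bind for the built-in result type: propagate errors, chain successes. *)
let ( >>= ) m f = match m with
  | Ok x -> f x
  | Error e -> Error e

(* Each function mentions only the errors it can produce; the
   polymorphic-variant rows unify, so the combined error set of a
   pipeline is inferred automatically. *)
let safe_div x y =
  if y = 0 then Error `Division_by_zero else Ok (x / y)

let safe_root x =
  if x < 0 then Error `Negative_argument
  else Ok (int_of_float (sqrt (float_of_int x)))

(* Inferred error type: [> `Division_by_zero | `Negative_argument ] *)
let compute x y =
  safe_div x y >>= fun q ->
  safe_root q

(* Handling is coverage-checked: omitting one of the two error cases
   below would be a type error, not a runtime surprise. *)
let describe = function
  | Ok n -> string_of_int n
  | Error `Division_by_zero -> "division by zero"
  | Error `Negative_argument -> "negative argument"
```

The paper goes well beyond this sketch, notably in nesting variants to get hierarchies of exceptional cases and in using syntax extensions to keep the notation light.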

The performance of such exceptions is understandably lower than that of native exceptions, given all the thunking and indirection that monads entail. The authors try various implementations and test their performance against native exceptions. Ultimately, monadic error management seems acceptable for actual error handling, but not for the control-flow purposes to which native exceptions are sometimes put in OCaml.

One interesting extension is to consider how efficient the implementations would be given more sophisticated control flow operators, such as continuations, coroutines, or delimited continuations, or whether native exceptions can be salvaged using a type and effects system in place of monads.